Feature Preserving Point Set Surfaces based on Non-Linear Kernel Regression

Abstract

1. Briefly summarize the paper's contributions. Does it address a new problem? Does it present a new approach? Does it show new types of results?

[AS] This paper presents a new point set surface representation by combining implicit moving least squares (IMLS) with robust statistics. In particular, the authors explicitly derive IMLS in the local kernel regression (LKR) framework, forming the basis of a new robust moving least squares (MLS) surface representation that can handle sparse sampling, preserve details, and handle sharp features.

[DS] The paper presents a method that applies robust statistics to MLS implicit surfaces. It uses robust kernel regression, which results in a representation that can handle sparse sampling, generates a continuous surface that preserves fine details better, and handles sharp features with controllable sharpness. It can handle outliers and high-frequency features, it is efficient and easy to implement, and it uses only local computations, so no preprocessing is needed. It improves the representation of sharp features and details of any frequency, and can deal with high-order corners and peaks.

[FP] The authors implicitly define the surface using a robust local kernel regression technique. The approach uses a first-order estimation of the local kernel, which is reduced to a zero-order estimation by approximating the gradient with the normal at each sample point. The resulting local kernel is equivalent to Kolluri's IMLS implicit function. The main contribution is the introduction of an efficient and robust technique that is resistant to both spatial and normal outliers. This robust approach allows the reconstruction of sharp features without needing to decompose or tag a local neighborhood.

[JD]

[LF]

[MK] The paper proposes interpreting the (implicit) moving least squares approach in the context of local kernel regression. This allows for designing a more robust algorithm that minimizes an energy that is less sensitive to outliers. Additionally, the authors show that by incorporating a bilateral term that penalizes normal variation, their method can be made to better adapt to sharp features.

2. What is the key insight of the paper? (in 1-2 sentences)

[AS] The key insight of this paper is that MLS surfaces can be expressed in terms of LKR, which is a statistical method for estimating the conditional expectation of a random variable.

[DS] The key insight is the fact that MLS surfaces can be expressed in …
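To make the mechanism described in these notes concrete, the sketch below evaluates Kolluri's IMLS implicit function together with a robust, iteratively reweighted variant in the same spirit: a robust weight on the residuals down-weights spatial outliers, and a bilateral-style weight on normal variation limits smoothing across creases. This is a minimal illustration only; the Gaussian weights, the Welsch-style robust kernels, the weighted-mean reference normal, and all names and parameters (imls_value, robust_imls_value, h, sigma_r, sigma_n) are assumptions of this sketch and are not taken from the paper.

```python
import numpy as np

def imls_value(x, points, normals, h):
    """Plain IMLS in the style of Kolluri: the implicit value at x is the
    weighted average of the signed distances from x to the samples' tangent planes."""
    d = x - points                                # (N, 3) offsets from samples to x
    w = np.exp(-np.sum(d * d, axis=1) / h ** 2)   # Gaussian spatial weights (assumed)
    s = np.sum(d * normals, axis=1)               # signed distance to each tangent plane
    return np.sum(w * s) / np.sum(w)

def robust_imls_value(x, points, normals, h, sigma_r=0.1, sigma_n=0.5, n_iter=5):
    """Robust variant (sketch): iteratively reweighted least squares with
    (a) a robust weight on the residual s_i - f, suppressing spatial outliers, and
    (b) a bilateral-style weight on normal variation, limiting smoothing across creases."""
    d = x - points
    w_s = np.exp(-np.sum(d * d, axis=1) / h ** 2)     # spatial weights
    s = np.sum(d * normals, axis=1)                   # per-sample tangent-plane distances

    f = np.sum(w_s * s) / np.sum(w_s)                 # initialise with the plain IMLS value
    n_ref = np.sum(w_s[:, None] * normals, axis=0)    # weighted mean normal as reference (assumption)
    n_ref /= np.linalg.norm(n_ref)

    for _ in range(n_iter):
        w_res = np.exp(-((s - f) / sigma_r) ** 2)                                # residual (outlier) weight
        w_nrm = np.exp(-np.sum((normals - n_ref) ** 2, axis=1) / sigma_n ** 2)   # normal-variation weight
        w = w_s * w_res * w_nrm
        f = np.sum(w * s) / np.sum(w)                 # reweighted zero-order fit
    return f

# Toy usage: a noisy plane z = 0 sampled with upward normals.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, (200, 2)), 0.01 * rng.normal(size=(200, 1))])
nrm = np.tile([0.0, 0.0, 1.0], (200, 1))
print(robust_imls_value(np.array([0.0, 0.0, 0.2]), pts, nrm, h=0.5))   # roughly 0.2 (signed distance)
```

In this sketch, smaller sigma_r and sigma_n reject outliers and preserve creases more aggressively, loosely playing the role of the controllable sharpness mentioned in the notes above.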


Similar resources

Feature Preserving Point Set Surfaces based on Non-Linear Kernel Regression

Moving least squares (MLS) is a very attractive tool for designing effective meshless surface representations. However, as long as approximations are performed in a least-squares sense, the resulting definitions remain sensitive to outliers and smooth out small or sharp features. In this paper, we address these major issues and present a novel point-based surface definition combining the simplicit...
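To spell out the contrast the abstract draws, the least-squares form of an implicit MLS fit and a robustified counterpart can be written schematically as below, where $\theta_i(\mathbf{x})$ is a spatial weight, $s_i(\mathbf{x})$ is the signed distance from $\mathbf{x}$ to the $i$-th sample's tangent plane, and $\rho$ stands for some bounded-influence robust loss; the specific functional used in the paper is not reproduced here.

```latex
% Quadratic energy: residuals enter squared, so a single outlier can
% dominate the minimiser and small or sharp features get averaged away.
E_{\mathrm{LS}}(f;\mathbf{x}) = \sum_i \theta_i(\mathbf{x})\,\bigl(f - s_i(\mathbf{x})\bigr)^2

% Robust energy: a bounded-influence loss \rho (e.g. of Welsch or Tukey type)
% caps the contribution of large residuals, so outliers and crease samples
% no longer drag the fit toward an over-smoothed average.
E_{\mathrm{rob}}(f;\mathbf{x}) = \sum_i \theta_i(\mathbf{x})\,\rho\bigl(f - s_i(\mathbf{x})\bigr)
```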


Hyperspectral Image Classification Based on the Fusion of the Features Generated by Sparse Representation Methods, Linear and Non-linear Transformations

The ability to record the high-resolution spectral signature of the earth's surface is the most important feature of hyperspectral sensors. On the other hand, classification of hyperspectral imagery is known as one of the methods for extracting information from these remote-sensing data sources. Despite the high potential of hyperspectral images from the information-content point of view, there...


A Geometry Preserving Kernel over Riemannian Manifolds

The kernel trick and projection to tangent spaces are two choices for linearizing data points lying on Riemannian manifolds. These approaches provide the prerequisites for applying standard machine learning methods on Riemannian manifolds. Classical kernels implicitly project data to a high-dimensional feature space without considering the intrinsic geometry of the data points. ...


An infeasible interior-point method for the $P_*$-matrix linear complementarity problem based on a trigonometric kernel function with full-Newton step

An infeasible interior-point algorithm for solving the $P_*$-matrix linear complementarity problem based on a kernel function with a trigonometric barrier term is analyzed. Each (main) iteration of the algorithm consists of a feasibility step and several centrality steps, whose feasibility step is induced by a trigonometric kernel function. The complexity result coincides with the best result for infea...


An interior-point algorithm for $P_{\ast}(\kappa)$-linear complementarity problem based on a new trigonometric kernel function

In this paper, an interior-point algorithm for the $P_{\ast}(\kappa)$-Linear Complementarity Problem (LCP) based on a new parametric trigonometric kernel function is proposed. By applying a strictly feasible starting-point condition and using some simple analysis tools, we prove that our algorithm has an $O((1+2\kappa)\sqrt{n}\,\log n\,\log\frac{n}{\epsilon})$ iteration bound for large-update methods, which coinc...



Publication date: 2013